Spiking Boltzmann Machines
Abstract
We first show how to represent sharp posterior probability distributions using real-valued coefficients on broadly tuned basis functions. Then we show how the precise times of spikes can be used to convey the real-valued coefficients on the basis functions quickly and accurately. Finally, we describe a simple simulation in which spiking neurons learn to model an image sequence by fitting a dynamic generative model.

1 Population codes and energy landscapes

A perceived object is represented in the brain by the activities of many neurons, but there is no general consensus on how the activities of individual neurons combine to represent the multiple properties of an object. We start by focusing on the case of a single object that has multiple instantiation parameters such as position, velocity, size and orientation. We assume that each neuron has an ideal stimulus in the space of instantiation parameters and that its activation rate or probability of activation falls off monotonically in all directions as the actual stimulus departs from this ideal. The semantic problem is to define exactly what instantiation parameters are being represented when the activities of many such neurons are specified.

Hinton, Rumelhart and McClelland (1986) consider binary neurons with receptive fields that are convex in instantiation space. They assume that when an object is present it activates all of the neurons in whose receptive fields its instantiation parameters lie. Consequently, if it is known that only one object is present, the parameter values of the object must lie within the feasible region formed by the intersection of the receptive fields of the active neurons. This will be called a conjunctive distributed representation. Assuming that each receptive field occupies only a small fraction of the whole space, an interesting property of this type of "coarse coding" is that the bigger the receptive fields, the more accurate the representation.
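The coarse-coding accuracy claim can be checked with a small Monte Carlo sketch (my own illustration, not from the paper, with made-up numbers): binary neurons get random circular receptive fields in a 2-D parameter space, and the feasible region for an object is the set of points that would produce exactly the same activity pattern as the object's true position. With the number of neurons held fixed, larger fields leave a smaller consistent region:

```python
import numpy as np

rng = np.random.default_rng(0)
centers = rng.random((200, 2))                      # 200 random receptive-field centres
side = np.linspace(0.0, 1.0, 100)
grid = np.stack(np.meshgrid(side, side), axis=-1).reshape(-1, 2)

for r in (0.05, 0.3):                               # small vs large receptive fields
    # which grid points fall inside each neuron's field
    inside = np.linalg.norm(grid[:, None, :] - centers[None, :, :], axis=2) <= r
    areas = []
    for obj in rng.random((20, 2)) * 0.5 + 0.25:    # object positions away from the edges
        active = np.linalg.norm(centers - obj, axis=1) <= r
        # feasible region: grid points whose activity pattern matches the object's
        areas.append((inside == active).all(axis=1).mean())
    print(f"field radius {r:.2f}: mean feasible area {np.mean(areas):.4f}")
```

The mean feasible area shrinks as the radius grows, because a larger field places more field boundaries near any given object position, and each boundary further constrains the intersection.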
However, large receptive fields lead to a loss of resolution when several objects are present simultaneously. When the sensory input is noisy, it is impossible to infer the exact parameters of objects, so it makes sense for a perceptual system to represent the probability distribution across parameters rather than just a single best estimate or a feasible region. The full probability distribution is essential for correctly combining information...
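The abstract's first claim can be sketched numerically (my own illustration, assuming the energy-landscape reading suggested by the section title: the real-valued coefficients weight the basis functions in the log-probability domain, so each unit contributes an energy landscape and the posterior is the exponential of their sum). A plain least-squares fit of the log density then represents a posterior far sharper than any individual basis function:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 400)
centers = np.linspace(0.0, 1.0, 20)
width = 0.1                                        # broadly tuned basis functions
basis = np.exp(-(x[:, None] - centers[None, :])**2 / (2 * width**2))

# Sharp target posterior: Gaussian with std 0.02, five times narrower than any basis function
log_target = -(x - 0.43)**2 / (2 * 0.02**2)

# Real-valued coefficients fitted in the log (energy) domain
coef, *_ = np.linalg.lstsq(basis, log_target, rcond=None)
log_fit = basis @ coef

p = np.exp(log_fit - log_fit.max())                # back to the probability domain
p /= p.sum()
mean = (p * x).sum()
std = np.sqrt((p * (x - mean)**2).sum())
print(f"basis width {width}, represented posterior std {std:.3f}")
```

Because the log density of a narrow Gaussian is just a smooth parabola, broad bumps fit it easily, and exponentiating the sum recovers a sharp distribution; this is the same reason a product of broad tuning curves can be much narrower than any one of them.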
Publication date: 2000